
    Network of Tinkerers: A Model of Open-Source Technology Innovation

    Airplanes were invented by hobbyists and experimenters, and some personal computers were as well. Similarly, many open-source software developers are interested in the software they make and are not focused on profit. Based on these cases, this paper presents a model of agents, called tinkerers, who want to improve a technology for their own reasons and by their own criteria, and who see no way to profit from it. Under these conditions, they would rather share their technology than work alone. The tinkerers who agree to share form an information network. The network's members optimally specialize, according to their opportunities, in particular aspects of the technology or in expanding or managing the network. Endogenously, there are incentives to standardize on designs and descriptions of the technology. A tinkerer in the network who sees an opportunity to produce a profitable product may exit the network to create a startup firm and conduct focused research and development. Thus a new industry can arise.
    Keywords: Technological Change, Open Source Software, Uncertainty, Innovation, Invention, Collective Invention, Hackers, Hobbyists, Experimenters, Airplane
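    The mechanism this abstract describes (agents pool improvements inside a network, then exit to found firms once a profitable application appears) can be caricatured in a few lines of code. The sketch below is purely illustrative: the skill distribution, growth rule, and exit threshold are invented and are not the paper's formal model.

    ```python
    import random

    random.seed(0)

    class Tinkerer:
        """Toy agent: contributes improvements and may exit to found a startup."""
        def __init__(self, skill):
            self.skill = skill

    # A shared information network: members pool improvements each period.
    network = [Tinkerer(random.random()) for _ in range(20)]
    tech_level, startups = 0.0, 0

    for period in range(10):
        if not network:
            break
        # Sharing lets every member build on the best idea found this period.
        tech_level += 0.1 * max(t.skill for t in network)
        # A member who now sees a profitable product exits to do focused R&D.
        exiting = [t for t in network if t.skill * tech_level > 0.5]
        startups += len(exiting)
        network = [t for t in network if t not in exiting]

    print(f"technology level {tech_level:.2f}, startups founded {startups}")
    ```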

    Turbulence, Inequality, and Cheap Steel

    Iron and steel production grew dramatically in the U.S. when mass production technologies for steel were adopted in the 1860s. According to new measures presented in this study, earnings inequality rose within the iron and steel industries around 1870, perhaps because technological uncertainty led to gambles and turbulence. Firms made a variety of technological choices and began formal research and development. Professional associations and journals for mechanical engineers and chemists appeared. A national market replaced local markets for iron and steel. An industrial union replaced craft unions. As new ore sources and cheap water transportation were introduced, new plants along the Great Lakes outcompeted existing plants elsewhere. Because new iron and steel plants in the 1870s were larger than any U.S. plants had ever been, cost accounting appeared in the industry and grew in importance. Uncertainty explains the rise in inequality better than a skill-bias account, according to which differences among individuals generate greater differences in wages. Analogous issues of inequality arise with respect to recent information technology.
    Keywords: technological change, Bessemer steel, technological uncertainty, turbulence, inequality, innovation
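    The earnings-inequality measures discussed here are standard dispersion statistics. As a rough illustration, a minimal Python sketch computing the variance of log earnings and a Gini coefficient follows; the earnings figures are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    def var_log_earnings(earnings):
        """Variance of log earnings, a standard wage-dispersion measure."""
        e = np.asarray(earnings, dtype=float)
        return np.log(e).var(ddof=1)

    def gini(earnings):
        """Gini coefficient via the sorted mean-difference formula."""
        e = np.sort(np.asarray(earnings, dtype=float))
        n = e.size
        i = np.arange(1, n + 1)
        return 2 * np.sum(i * e) / (n * np.sum(e)) - (n + 1) / n

    # Invented annual earnings for two census years: dispersion rises in the second.
    earnings_1860 = [300, 320, 340, 360, 380, 400]
    earnings_1870 = [250, 300, 380, 450, 600, 900]
    for year, e in [(1860, earnings_1860), (1870, earnings_1870)]:
        print(year, round(var_log_earnings(e), 3), round(gini(e), 3))
    ```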

    Top-Quark Physics at the LHC

    The top quark is the heaviest of all known elementary particles. It was discovered in 1995 by the CDF and D0 experiments at the Tevatron. With the start of the LHC in 2009, an unprecedented wealth of measurements of the top quark's production mechanisms and properties has been performed by the ATLAS and CMS collaborations, most of them resulting in smaller uncertainties than those achieved previously. At the same time, huge progress was made on the theoretical side, yielding significantly improved predictions up to next-to-next-to-leading order in perturbative QCD. Due to the vast number of events containing top quarks, a variety of new measurements became feasible and opened a new window to precision tests of the Standard Model and to contributions of new physics. In this review, originally written for a recent book on the results of LHC Run 1, top-quark measurements obtained so far from LHC Run 1 are summarised and put in context with the current understanding of the Standard Model.
    Comment: 35 pages, 25 figures. To appear in "The Large Hadron Collider -- Harvest of Run 1", Thomas Schörner-Sadenius (ed.), Springer, 2015 (532 pages, 253 figures; ISBN 978-3-319-15000-0; eBook ISBN 978-3-319-15001-7). For more details, see http://www.springer.com/de/book/9783319150000

    Proposed Category System for 1960-2000 Census Occupations

    This paper proposes a detailed, consistent category system for occupations in the Census of Population data from 1960 to 2000. Most of the categories are based on the 1990 Census occupation definitions. We analyze employment levels, average earnings levels, and earnings variance in our occupation categories over time, compare these to similar trends for occupations defined in the occ1950 IPUMS classification, and test both classifications for consistency over time.
    Keywords: occupations; jobs; classification; categories; metadata; Census; IPUMS
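    In practice, a consistent category system like this amounts to a crosswalk from year-specific census occupation codes to unified categories. A minimal sketch of how such a mapping might be applied is below; the codes and category names are hypothetical, not the paper's actual 1990-based definitions.

    ```python
    # Hypothetical crosswalk from (census year, occupation code) to a
    # consistent category; real crosswalks map hundreds of year-specific codes.
    CROSSWALK = {
        (1960, 801): "machinists",
        (1990, 644): "machinists",
        (2000, 514041): "machinists",
        (1960, 310): "secretaries",
        (1990, 313): "secretaries",
    }

    def consistent_category(year: int, code: int) -> str:
        """Map a year-specific census occupation code to a consistent category."""
        return CROSSWALK.get((year, code), "unclassified")

    # Usage: harmonize records drawn from different census years.
    records = [(1960, 801), (1990, 644), (2000, 514041), (1990, 999)]
    for year, code in records:
        print(year, code, "->", consistent_category(year, code))
    ```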

    Who had an occupation? Changing boundaries in historical U.S. census data

    The original official purpose of the U.S. Census was to gather information for designing political districts of approximately equal population. Increasingly, Census data have been used for descriptive and social-scientific purposes, for example in research on work and occupations. This paper examines how the category of 'occupation' has changed and looks at several issues which arise in comparing the present-day workforce with the workforce of past decades. Changes in concepts, enumeration practices, and historical context have greatly affected how many persons were recorded as having occupations, especially among married women, American Indians, teenagers, and people who had ceased paid work. (author's abstract)

    Workplace Organization and Innovation

    This study uses data on Canadian establishments to test whether particular organizational structures are correlated with the likelihood of adopting process and product innovations, controlling for the endogeneity of the predictors. We find that establishments with decentralized decision-making, information-sharing programs, or incentive pay plans are significantly more likely to innovate than other establishments. Larger establishments and those with a high vacancy rate are also more likely to innovate. These findings are consistent with a model in which workers hold information about production inefficiencies or consumer demands that can lead to productive innovations, and in which these workplace organization attributes facilitate the communication and implementation of those ideas.
    Keywords: innovation, decision-making, information-sharing
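    The hypothesis being tested amounts to a binary-outcome regression of innovation adoption on workplace-organization indicators. A minimal sketch with statsmodels follows; the data are simulated, the variable names hypothetical, and this plain probit omits the endogeneity controls (e.g., instrumenting the organizational predictors) that the study applies.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    # Simulated establishment data standing in for the Canadian survey.
    df = pd.DataFrame({
        "decentralized": rng.integers(0, 2, n),
        "info_sharing": rng.integers(0, 2, n),
        "incentive_pay": rng.integers(0, 2, n),
        "log_size": rng.normal(4.0, 1.0, n),
        "vacancy_rate": rng.uniform(0.0, 0.2, n),
    })
    latent = (0.5 * df["decentralized"] + 0.4 * df["info_sharing"]
              + 0.3 * df["incentive_pay"] + 0.2 * df["log_size"]
              + rng.normal(0.0, 1.0, n))
    df["innovated"] = (latent > latent.median()).astype(int)

    # Plain probit of innovation on workplace-organization indicators.
    X = sm.add_constant(df[["decentralized", "info_sharing", "incentive_pay",
                            "log_size", "vacancy_rate"]])
    print(sm.Probit(df["innovated"], X).fit(disp=False).summary())
    ```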

    Financing Climate-Resilient Infrastructure: A Political-Economy Framework

    Urban infrastructure investment is needed both for mitigation of climate risks and for improved urban resilience. Financing such investments requires translating those benefits into measurable returns on investment that capital markets can understand and appreciate, in a context of emerging risks. This paper develops a generic framework to identify the factors that are necessary and sufficient for climate-resilient infrastructure to be economically favored in private investment decisions. We specifically demonstrate that carbon pricing alone will not generate the needed will, because market prices at present systematically fail to account for climate change risks such as the costs of stranded assets and the national and local co-benefits of investments in climate resilience. Carbon pricing is necessary, but not sufficient, for enhanced private financing of climate-resilient infrastructure. The Paris Agreement and other supra-local policies and actors, including city networks, can concretely help to generate the social and political will needed for investments in climate change mitigation and resilience at the city level.

    Proteomic analysis of heart failure hospitalization among patients with chronic kidney disease: The Heart and Soul Study.

    BACKGROUND: Patients with chronic kidney disease (CKD) are at increased risk for heart failure (HF). We aimed to investigate differences in proteins associated with HF hospitalizations among patients with and without CKD in the Heart and Soul Study. METHODS AND RESULTS: We measured 1068 unique plasma proteins from baseline samples of 974 participants in the Heart and Soul Study who were followed for HF hospitalization over a median of 7 years. We sequentially applied forest regression and Cox survival analyses to select prognostic proteins. Among participants with CKD, four proteins were associated with HF at Bonferroni-level significance (p < 2.5×10^-4): Angiopoietin-2 (HR [95% CI] 1.45 [1.33, 1.59]), Spondin-1 (HR [95% CI] 1.13 [1.06, 1.20]), tartrate-resistant acid phosphatase type 5 (HR [95% CI] 0.65 [0.53, 0.78]), and neurogenic locus notch homolog protein 1 (NOTCH1) (HR [95% CI] 0.67 [0.55, 0.80]). These associations persisted at p < 0.01 after adjustment for age, estimated glomerular filtration rate, and history of HF. CKD was a significant interaction term in the associations of NOTCH1 and Spondin-1 with HF. Pathway analysis showed a trend toward higher representation of the cardiac hypertrophy and complement/coagulation pathways among proteins prognostic of HF in the CKD subgroup. CONCLUSIONS: These results suggest that markers of heart failure differ between patients with and without CKD. Further research is needed to validate novel markers in cohorts of patients with CKD and adjudicated HF events.
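    The two-stage screen described above (a forest model to rank candidate proteins, then per-protein Cox models with a Bonferroni threshold) might look roughly like the following sketch. This is schematic, not the study's code: the data are simulated, and the exact forest variant (here a random forest classifier on the event indicator) is an assumption.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n, n_proteins = 300, 50  # the study measured 1068 proteins in 974 participants
    proteins = pd.DataFrame(rng.normal(size=(n, n_proteins)),
                            columns=[f"protein_{i}" for i in range(n_proteins)])
    time = rng.exponential(7.0, n)    # years to HF hospitalization or censoring
    event = rng.integers(0, 2, n)     # 1 = HF hospitalization observed

    # Stage 1: a forest model ranks proteins by importance for the HF outcome.
    forest = RandomForestClassifier(n_estimators=200, random_state=0)
    forest.fit(proteins, event)
    top = proteins.columns[np.argsort(forest.feature_importances_)[-10:]]

    # Stage 2: per-protein Cox models with a Bonferroni-corrected threshold.
    alpha = 0.05 / len(top)
    for name in top:
        df = pd.DataFrame({"T": time, "E": event, name: proteins[name]})
        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        p = cph.summary.loc[name, "p"]
        flag = "significant" if p < alpha else ""
        print(f"{name}: HR={cph.hazard_ratios_[name]:.2f}, p={p:.3g} {flag}")
    ```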

    Measuring Metacognition in Cancer: Validation of the Metacognitions Questionnaire 30 (MCQ-30)

    Objective: The Metacognitions Questionnaire 30 (MCQ-30) assesses metacognitive beliefs and processes which are central to the metacognitive model of emotional disorder. As recent studies have begun to explore the utility of this model for understanding emotional distress after cancer diagnosis, it is also important to assess the validity of the MCQ-30 for use in cancer populations. Methods: 229 patients with primary breast or prostate cancer completed the MCQ-30 and the Hospital Anxiety and Depression Scale pre-treatment and again 12 months later. The structure and validity of the MCQ-30 were assessed using factor analyses and structural equation modelling. Results: Confirmatory and exploratory factor analyses provided evidence supporting the validity of the previously published 5-factor structure of the MCQ-30. Specifically, both pre-treatment and 12 months later, this solution provided the best fit to the data, and all items loaded on their expected factors. Structural equation modelling indicated that two dimensions of metacognition (positive and negative beliefs about worry) were significantly associated with anxiety and depression as predicted, providing further evidence of validity. Conclusions: These findings provide initial evidence that the MCQ-30 is a valid measure for use in cancer populations.
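    Confirmatory factor analysis of a 5-factor structure like the MCQ-30's can be sketched in Python with the semopy package, which accepts lavaan-style model syntax. The item-to-factor assignments below are abbreviated and the item names hypothetical; the real MCQ-30 assigns six items to each of its five factors.

    ```python
    import pandas as pd
    from semopy import Model, calc_stats

    # Abbreviated 5-factor measurement model in lavaan-style syntax.
    # Item names (mcq1..mcq15 here) and their factor assignments are
    # illustrative; the real MCQ-30 has six items per factor.
    DESC = """
    pos_beliefs_worry =~ mcq1 + mcq2 + mcq3
    neg_beliefs_worry =~ mcq4 + mcq5 + mcq6
    cog_confidence    =~ mcq7 + mcq8 + mcq9
    need_control      =~ mcq10 + mcq11 + mcq12
    cog_self_consc    =~ mcq13 + mcq14 + mcq15
    """

    def fit_cfa(items: pd.DataFrame):
        """Fit the CFA and return fit indices (CFI, RMSEA, chi-square, ...)."""
        model = Model(DESC)
        model.fit(items)
        return calc_stats(model)

    # Usage: fit_cfa(pre_treatment_items) and fit_cfa(followup_items), where
    # each DataFrame holds one column of scores per questionnaire item.
    ```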

    Quantum Adiabatic Algorithms, Small Gaps, and Different Paths

    We construct a set of instances of 3SAT which are not solved efficiently using the simplest quantum adiabatic algorithm. These instances are obtained by picking random clauses all consistent with two disparate planted solutions and then penalizing one of them with a single additional clause. We argue that by randomly modifying the beginning Hamiltonian, one obtains (with substantial probability) an adiabatic path that removes this difficulty. This suggests that the quantum adiabatic algorithm should in general be run on each instance with many different random paths leading to the problem Hamiltonian. We do not know whether this trick will help for a random instance of 3SAT (as opposed to an instance from the particular set we consider), especially if the instance has an exponential number of disparate assignments that violate few clauses. We use a continuous imaginary-time Quantum Monte Carlo algorithm in a novel way to numerically investigate the ground state as well as the first excited state of our system. Our arguments are supplemented by Quantum Monte Carlo data from simulations with up to 150 spins.
    Comment: The original version considered a unique satisfying assignment and one problematic low-lying state. The revision argues that the algorithm with path change will succeed when there are polynomially many low-lying states.
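    The path-randomization idea can be illustrated for a toy system: interpolate H(s) = (1-s)H_B + sH_P and compare the minimum spectral gap along paths generated by different random beginning Hamiltonians. This dense-matrix sketch is only a numerical illustration of the idea, not the paper's Quantum Monte Carlo method (which is what makes 150-spin simulations feasible); the random-coefficient transverse field is an assumed form of the random modification.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 6                 # toy number of spins (the paper simulates up to 150)
    dim = 2 ** n
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])

    def transverse_field(coeffs):
        """Beginning Hamiltonian H_B = -sum_i c_i sigma^x_i, randomized via c_i."""
        H = np.zeros((dim, dim))
        for i, c in enumerate(coeffs):
            op = np.array([[1.0]])
            for j in range(n):
                op = np.kron(op, sx if j == i else np.eye(2))
            H -= c * op
        return H

    # Diagonal problem Hamiltonian: clause-violation counts with a unique
    # planted ground state (a crude stand-in for the paper's 3SAT instances).
    costs = rng.integers(1, 5, dim).astype(float)
    costs[rng.integers(dim)] = 0.0
    H_P = np.diag(costs)

    def min_gap(H_B, steps=50):
        """Minimum gap between ground and first excited state along the path."""
        gaps = []
        for s in np.linspace(0.0, 1.0, steps):
            evals = np.linalg.eigvalsh((1 - s) * H_B + s * H_P)
            gaps.append(evals[1] - evals[0])
        return min(gaps)

    # Different random beginning Hamiltonians give different minimum gaps,
    # so rerunning with fresh random paths can avoid a bad (small-gap) path.
    for trial in range(3):
        H_B = transverse_field(rng.uniform(0.5, 1.5, n))
        print(f"path {trial}: min gap = {min_gap(H_B):.4f}")
    ```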